Get up and running with large language models locally.
Discover, download, and run local LLMs
Stay connected and chat seamlessly across all of your devices
Chat with ALL AI Bots Concurrently, Discover the Best
Your AI Copilot on the Desktop
Turn your computer into an AI computer
A desktop client that supports multiple LLM providers
The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer
LLM inference in C/C++
Run AI models locally. One-click install w/ zero configuration.
Desktop AI Assistant / Universal MCP Client
A better AI (LLM) client. Full-featured and lightweight. Supports multiple workspaces, a plugin system, cross-platform use, local-first storage with real-time cloud sync, Artifacts, and MCP.
Private & local AI personal knowledge management app.
A Flutter LLM chat client. Supports Android, iOS & Harmony Next.
DeepChat is a multi-platform AI client dedicated to bringing the convenience of AI to more people.
NeatChat: A simplified and optimized fork of NextChat
All-in-one LLM CLI tool featuring Shell Assistant, Chat-REPL, RAG, AI tools & agents, with access to OpenAI, Claude, Gemini, Ollama, Groq, and more.
The easiest way to use local and online AI models
Your Intelligent Assistant for AI PC
A chat client that strives for openness, utilizing APIs from various LLMs to achieve the best chat experience, and implementing productivity tools through the MCP protocol.
Run AI models on your desktop with one click.
A portable terminal AI interface
An open-source, modern-design AI chat framework. Supports multiple AI providers (OpenAI / Claude 4 / Gemini / Ollama / DeepSeek / Qwen), Knowledge Base (file upload / knowledge management / RAG), Multi-Modals (Plugins / Artifacts), and Thinking.
Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (gpt, claude, gemini, llama, qwen, mistral).
An agent application deployed locally on the Ollama inference framework, implementing MCP tool calling, short- and long-term memory, and more.
Model swapping for llama.cpp
Distribute and run LLMs with a single file.
Your versatile AI chat companion, compatible with leading LLMs.
Open-source, lightweight, and modern. A local AI chat client supporting multiple providers and local models.
A Sleek AI Assistant & MCP Client